
Scorecard with risk shown

Overview

One easy way to promote management that deals with risk and uncertainty in an integrated way is to adjust the typical scorecard design so that it shows forecasts for the future, with their uncertainty expressed in some way. This encourages the same kind of wide-ranging discussion that high level risk registers aim for, but in a setting where all objectives are considered, not just a subset focused on bad things we are trying to minimise.

The Excel spreadsheet below is a template for a scorecard that shows 'risk' in a way that can be presented as a 'risk register', but does not involve Risk Listing.

The idea is extremely simple. Each performance indicator on the scorecard has information next to it that includes actual results achieved and forecasts about the future. These forecasts include the best estimate, but also the probability of a disappointing result and the probability of a downright problematic result. These last two are the 'risks' if anyone is interested. (The effect is improved further if the set of objectives includes items like safety, health, and reducing bad debt, which are the typical content of risk registers. These should really be part of the main list of objectives so that they get proper consideration along with everything else.)

A second worksheet maps the performance indicators to the plans, projects, practices, controls, or other actions that drive those indicators.

If changes to the current plans are proposed there is space to show what is forecast if the proposed changes are made.

The two key worksheets are:

  • Scorecard that lists the performance indicators and shows the various actual and forecast results.

  • Controls that provides a matrix showing how existing and proposed actions are linked to the performance indicators.

A macro on a button is provided to keep the performance indicators on the Controls sheet aligned with those of the Scorecard (which is the source of this information).

This design can be used at different levels. It could be used to design internal controls around a book-keeping process, in detail. And yet it can also be used to show a company's top level scorecard and provide the Board with an appropriate view of the 'risks' that is simply derived from the performance indicators they have already chosen as 'key'. If anyone wants to know how the performance indicators are thought to be driven by elements of the plan, or how 'risks' are 'controlled', then the Controls matrix provides a compact and compelling account of this, when properly filled in.

The file itself

The Excel file itself is here:

DOWNLOAD Scorecard with risk shown

It was prepared using Excel 2010 and is saved as a macro-enabled file (i.e. ".xlsm") because it contains some code to automate links between the scorecard and the 'risk - control matrix'.

Detailed guidance

Set up and use

You can start by listing performance indicators, or by listing actions (a.k.a. controls). It does not matter which. However, once both are mostly in place you can start linking them.

(The column headings and other key headings are also explained in comments on the spreadsheet. Just hover your mouse pointer over a cell with a tiny red mark in its top right corner and the guidance should pop up; it disappears when you move the pointer away.)

Scorecard

The scorecard is just a list of performance indicators with information about each. Some text has been entered already on the template to give you an idea of what goes where, but you need to delete or type over it.

Each performance indicator needs a unique reference in the 'ref' column. This is used to link the two worksheets so it is very important to avoid changing references unless you are actually changing the indicators.

The first group of columns, headed 'Indicator', gives information about the performance indicators.

  • 'Category' is used to categorize the indicators, e.g. as you might for a Balanced Scorecard, with its four perspectives. You can choose any values you like.

  • 'Level' can be used to show what organizational level the indicator is for. You can choose any system of values you like, or hide or delete this column.

  • The 'Performance indicator short name' needs to be short but memorable, so that readers can easily tell which indicator it refers to. This short name provides the column headings in the matrix linking performance indicators with actions.

  • The 'Definition' is where you specify precisely how the indicator is defined. This is often missing from scorecards but I offer it here because clear definitions are crucial to understanding what a scorecard means and avoiding a range of common mistakes.

  • Use 'Accountable for indicator' to show whose job it is to be the expert on what is being done to manage each indicator, even if they are not responsible for carrying out all the relevant actions. That person will give an account of activity related to the indicator.

(Note: The remaining columns of Scorecard are automatically populated from the Controls worksheet. Don't make the mistake of trying to type something into them.)

The next group of columns, headed 'Actual results', is for information about actual results so far. This could be done in many ways, but the template offers two columns:

  • the last month's result; and

  • the 12 month rolling average, which is usually a good way to see performance free from seasonal variations.

The next group of columns, headed 'Forecast results with existing plans (including controls)', is where risk starts to appear. The columns are for forecast results on each indicator, under current plans (or controls if you prefer), expressed in three ways.

  • 'Best estimate for 3m ahead' (though it could be another time horizon if you prefer, and perhaps a different kind of mid-point, such as the probability-weighted average).

  • 'Probability of disappointment', which can be seen as a 'risk' if you like. This is the probability of the actual result (in 3 months, say) being below some level defined elsewhere (on the Definitions worksheet). The probability could be nothing more than a gut feeling, the averaged gut feelings of a group, or something calculated from past experience (a simple sketch of one way to do this appears just after this list).

  • 'Probability of serious problem' which can be seen as another 'risk', more serious than the probability of disappointment, or as another number characterising the 'risk' represented by the forecast probability distribution of the performance indicator. Again, 'serious problem' is to be defined on the Definitions worksheet for each performance indicator.
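
If you want something more repeatable than a gut feeling, one simple option is to estimate each probability from past results. The VBA function below is a minimal sketch of that idea and is not part of the template: it returns the fraction of a range of past results that fell below a threshold, such as the 'disappointment' or 'serious problem' level defined for that indicator. The cell references in the usage comment are hypothetical.

Public Function FractionBelow(pastResults As Range, threshold As Double) As Variant
    ' Minimal sketch (not part of the template): estimate a probability as
    ' the fraction of past results that fell below a threshold.
    ' Example use in a worksheet cell, with hypothetical references:
    '   =FractionBelow(B2:B25, Definitions!C2)
    Dim c As Range
    Dim total As Long, below As Long
    For Each c In pastResults.Cells
        If VarType(c.Value) = vbDouble Then       ' cell holds a number; skips blanks, text and errors
            total = total + 1
            If c.Value < threshold Then below = below + 1
        End If
    Next c
    If total = 0 Then
        FractionBelow = CVErr(xlErrNA)            ' no usable history to go on
    Else
        FractionBelow = below / total
    End If
End Function

This is just the raw historical frequency. It takes no account of trends or of the plans in place, so treat it as a starting point for judgement rather than a forecast in itself.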

The final group of columns, headed 'Forecast results with proposed changes', is for forecasts given proposed changes to the actions. The details are the same as for the previous group and once again the values come from the Controls sheet automatically.

Controls

This worksheet has more rows and columns than the others so take time to understand how it works. Most of the columns are for information about controls. A little text has been entered already to show you where things go, so you need to delete or type over it.

  • The 'Sort' field is for any text or numeric values you invent to put controls into order. This is useful if you sort your controls into a different order (e.g. by person responsible) and want to put them back into the original order. It is also useful for grouping controls by type, and useful again if you create a library of controls and want to pull out just some for inclusion in your matrix.

  • The 'Control group' field is very important because having a framework of control types helps to ensure you think of controls of all types.

  • The 'Control' field should describe the control or at least give a name that makes it clear which control is meant. Remember that 'control' could be any element of a plan, a design feature, or just about any other conceivable type of action, depending on how you are using this template.

  • The 'Status / Action' column has a different function depending on whether the controls are in the existing plan (i.e. in place or under development) or are proposed changes. For controls currently in the plan, show them as either 'In place' or 'Developing'. For proposed control changes, show them as 'Add', 'Modify', or 'Remove'.

  • The 'To do date' is when the control is to be in place.

  • The 'Control ID' should be unique if you want to use it for something, but is not currently used in the macro.

  • A 'Ref to detail' allows you to include a link, perhaps a hyperlink, to more detail about the control.

  • An 'Owner - development' is a person in charge of developing the control and getting it up and running.

  • The 'Development cost' is the cost of developing the control and getting it up and running; the cost of running it day to day goes in 'Operation cost' below.

  • An 'Owner - operation' is a person in charge of operating the control.

  • The 'Frequency' is when the control is carried out, which could be event driven (e.g. 'every product launch') or time scheduled (e.g. 'weekly').

  • The 'Sampling' field explains which items the control is to be applied to. Sometimes this will be a sample taken in a particular way from a particular population, but often it will be ALL items in the population.

  • The 'Operation cost' field is for the cost of operating the control, per unit of time or per execution, whichever fits better.

  • 'Performance evidence' is how you know the control has been performed.

  • The 'Performance rating' could be used for some kind of summary assessment of how well the control is performing, or at least how well it is being carried out.

'KPI ref' and 'Short name' refer to the column headings for the remaining columns. These headings are populated automatically by the macro. The values to go in the matrix of cells below them need careful thought, as explained a little later.

The heading 'ACTIONS/CONTROLS CURRENTLY IN PLACE AND UNDER DEVELOPMENT' marks the start of a list of actions/controls showing what is currently reflected in the forecasts, which should be actions/controls already in place or planned.

The heading 'CHANGES PROPOSED' marks the start of a list of changes to those existing actions/controls, which can be either adding items or removing them. If you want to change a control, Remove the control as one change then Add the revised control as the next change.

Once you have some controls/actions listed the next step is to show which of the performance indicators will be affected (positively or negatively) by each control/action. You have some interesting design options here, so give your approach to this some careful thought.

The simplest approach is just to indicate where a control affects a performance indicator (a.k.a. 'risk'), without saying by how much. Simply enter 1 where there is a link and 0 otherwise. If you take this approach and one of your changes is to Remove a control, then the effect of the Remove should be shown by entering -1 to cancel out each 1.

The more sophisticated alternative is to try to show how much effect the control has on the performance indicator. Your approach to this needs to be carefully worked out.

Whichever approach you take, the 'Count' and 'Extended count' numbers are crude summaries. The Count number just adds the effects in the column above. The Extended count number takes the Count and adds to it the sum of the effects in the second matrix (the proposed changes). These numbers do not give you the forecasts needed by the Scorecard worksheet. Those have to be entered by hand, as explained next.
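
To illustrate with purely hypothetical values: if an indicator's column contains 1, 1 and 0 against the existing controls, its Count is 2; if the proposed changes for that indicator then include a Remove (-1) and an Add (+1), its Extended count is 2 + (-1 + 1) = 2, i.e. the same overall coverage but with a different mix of controls.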

The 'Best estimate for 3m ahead', 'Probability of disappointment', and 'Probability of serious problem' values have to be entered by hand (or formulae included to drag them from somewhere else). The reason they should be entered on the Controls sheet rather than directly on the Scorecard is that forecasts should be made with plans in mind, not produced by plucking numbers out of the air. The Count and Extended count numbers may be some help in spotting where there is an obvious mis-match between the actions and the forecasts.

Note that the Probability of disappointment can never be less than the Probability of serious problem, because Probability of disappointment really means the probability of disappointment or worse. For example, if 'disappointment' means a result below 90 and a 'serious problem' means a result below 70, every result below 70 is also below 90, so the first probability includes the second.

Definitions

The probability of 'disappointment' or a 'serious problem' depends on how you define those terms. On this worksheet you need to define those terms for each performance indicator (PI). Try to choose levels that will be equally serious for each performance indicator, and choose levels that provide informative probabilities.

Identify the 'PI' in some way in the first column.

In the column for Definition of 'disappointment' describe a level of performance that would be disappointing enough for people to want to avoid it, but would not be a serious problem.

In the column for Definition of 'serious problem' describe a level of performance that would be a serious problem. This should be much worse than just disappointing.
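
For example (purely hypothetical figures), for a 'debtor days' indicator you might define 'disappointment' as average debtor days exceeding 45 and a 'serious problem' as average debtor days exceeding 60.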

Some examples of the sort of definition rules you could use appear on the template.

Tailoring

With the right tweaks you can present this basic template as anything from a detailed design analysis for a book-keeping process to a board level Balanced Scorecard. It's all in the details of wording and your choice of performance indicators and actions/controls.

As with any spreadsheet you can change the design as much as you like. However, this spreadsheet has a useful macro in it that helps keep the performance indicators mentioned on the Controls sheet matched to the ones in the Scorecard. Doing this without the macro could be quite tricky and you might easily make mistakes.

If you want to keep on using the macro then you need to know what you cannot change on the spreadsheet without confusing the macro.

Changing worksheet names

The worksheet names 'Scorecard' and 'Controls' are used in the macro so if you want to change them you also need to edit the macro. However, this is easy because you only have to change the name in one place.

The VBA code for the macro is in Module1. If you have not done anything with macros/VBA before then go to View, Macros, View Macros, and finally Edit. Provided the template is the only Excel file you have open, that should open the VBA editor and show you the code. Scroll to the top and you will see that the top two lines are the ones you want.

Const wkc = "Controls"      ' name of the controls worksheet
Const wks = "Scorecard"     ' name of scorecard worksheet

Just replace the names Controls and/or Scorecard with the names you prefer and do the same for the worksheet tabs themselves. Obviously, do not change the constant names wkc and wks themselves.
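
For example (hypothetical names), if you renamed the tabs 'Actions' and 'Board scorecard', the two lines would become:

Const wkc = "Actions"           ' name of the controls worksheet
Const wks = "Board scorecard"   ' name of scorecard worksheet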

Changing column and row headings

The macro identifies certain columns and rows using labels in hidden columns and rows on the Scorecard and Controls worksheets. You can reveal these by selecting the entire worksheet (click the small square at the top left of the worksheet area, where the row and column headings meet), then using Format, Hide & Unhide to unhide rows, then unhide columns.

Some of the labels are in red, which means they must be present or the macro will not work. If any of the black labels are missing the macro generates warnings but will still work.

You can still change the text of a column heading with one of these special red labels and you can move the position of a column or row that is labelled. However, you need to respect the idea of each label.

The rows and columns for the hidden labels are fixed in the macro and, being hidden, there is no obvious reason why you would want to move them. However, if you have to change them for some reason then you need to make a tiny edit to the VBA code. Get to the code as described above. The 3rd to 6th lines are the ones you want:

Const wks_tag_row = 3       ' row for column tags on scorecard - normally hidden
Const wks_tag_col = 1       ' column for row tags on scorecard - normally hidden
Const wkc_tag_row = 3       ' row for column tags on controls matrix - normally hidden
Const wkc_tag_col = 1       ' column for row tags on controls matrix - normally hidden

Change the digits to correctly locate the hidden rows and columns once you have made your change. Row and column numbers in Excel start from 1, not zero.
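
For example (a hypothetical change), if you had moved the hidden tag row on the Scorecard worksheet from row 3 to row 2, the first of those lines would become:

Const wks_tag_row = 2       ' row for column tags on scorecard - normally hidden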

Changing formats

You can play with borders, fills, fonts, etc as much as you like. One point that may be useful to know, however, is that when the macro creates extra columns on the Controls worksheet for newly added performance indicators it copies the format of the whole column immediately to the left of the first performance indicator column on the Controls worksheet.

Advantages of this design

The design of this template is a development of a proposal by Trevor Llanwarne, at the time the UK's Government Actuary. His idea was for a risk register that would make sense to a board. I have just taken it a logical step further by making it part of a scorecard and adding a second sheet behind it to map actions to results. There are also many detailed tweaks in my design.

This design provides a document that can be presented as a 'risk register' but it does not lead people into the folly of Risk Listing. Instead, it's a relatively straightforward and sensible monitoring and planning tool that uses forecasts to alert people to potential problems ahead. Instead of managing individual 'risks' you evaluate alternative plans for their effect on all performance indicators.

This is not the whole of risk management, of course, but it's a significant part of it and should satisfy most requirements to have a 'risk register'.






Words © 2014 Matthew Leitch.